Model-based targeted dimensionality reduction for neuronal population data

Neural Information Processing Systems

Summarizing high-dimensional data using a small number of parameters is a ubiquitous first step in the analysis of neuronal population activity. Recently developed methods use targeted approaches that work by identifying multiple, distinct low-dimensional subspaces of activity that capture the population response to individual experimental task variables, such as the value of a presented stimulus or the behavior of the animal. These methods have gained attention because they decompose total neural activity into what are ostensibly different parts of a neuronal computation. However, existing targeted methods have been developed outside of the confines of probabilistic modeling, making some aspects of the procedures ad hoc, or limited in flexibility or interpretability. Here we propose a new model-based method for targeted dimensionality reduction based on a probabilistic generative model of the population response data.
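The general idea behind targeted dimensionality reduction can be illustrated with a much simpler, non-probabilistic sketch (this is not the paper's generative model): regress each neuron's activity onto the task variables, then treat the orthonormalized regression coefficients as axes of a task-variable subspace. All variable names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population activity: trials x neurons, driven by two task variables.
n_trials, n_neurons = 200, 50
stimulus = rng.normal(size=n_trials)        # e.g. value of a presented stimulus
choice = rng.choice([-1.0, 1.0], n_trials)  # e.g. behavior of the animal
X = np.column_stack([np.ones(n_trials), stimulus, choice])  # design matrix

true_beta = rng.normal(size=(3, n_neurons))
activity = X @ true_beta + 0.1 * rng.normal(size=(n_trials, n_neurons))

# Least-squares regression of each neuron's activity onto the task variables.
beta, *_ = np.linalg.lstsq(X, activity, rcond=None)

# Each task variable's coefficient vector spans a one-dimensional "targeted"
# axis; orthonormalize to obtain distinct low-dimensional subspaces.
coef = beta[1:]                 # drop the intercept row -> (2, n_neurons)
Q, _ = np.linalg.qr(coef.T)     # orthonormal basis, one axis per task variable
projected = activity @ Q        # trials projected onto the targeted subspace
print(projected.shape)          # -> (200, 2)
```

A model-based method like the one proposed here replaces the ad hoc regression-then-orthogonalize recipe with inference in an explicit probabilistic generative model of the population response.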


Scaling Gaussian Process Regression with Derivatives

David Eriksson, Kun Dong, Eric Lee, David Bindel, Andrew G. Wilson

Neural Information Processing Systems

Computing the model fit term, as well as the predictive moments of the GP, requires solving linear systems with the kernel matrix, while the complexity term, or Occam's factor [18], is the log determinant of the kernel matrix.
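Both terms can be read off from a single Cholesky factorization of the kernel matrix. The following standalone sketch (with an assumed RBF kernel and illustrative lengthscale and noise values, not the paper's scalable method) computes the fit term y^T K^{-1} y and the log determinant for the GP log marginal likelihood:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D dataset and an RBF kernel (lengthscale and noise are assumptions).
x = rng.uniform(-3, 3, size=40)
y = np.sin(x) + 0.1 * rng.normal(size=40)

def rbf(a, b, lengthscale=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

noise = 1e-2
K = rbf(x, x) + noise * np.eye(len(x))

# Cholesky factorization K = L L^T yields both terms stably:
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y

fit_term = y @ alpha                        # model fit term  y^T K^{-1} y
log_det = 2.0 * np.sum(np.log(np.diag(L)))  # Occam's factor  log|K|
log_marginal = -0.5 * (fit_term + log_det + len(x) * np.log(2 * np.pi))
print(log_marginal)
```

The direct Cholesky route costs O(n^3); the point of scalable GP methods is to approximate these same two quantities without forming and factoring the full kernel matrix.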





704cddc91e28d1a5517518b2f12bc321-AuthorFeedback.pdf

Neural Information Processing Systems

We thank the reviewers for their feedback. We will first respond to shared comments and then to individual ones. Additionally, reviewers 2 and 3 requested clarification regarding the advantages of DCA over other methods. For instance, one could attempt to correlate each neuron's contribution to the DCA subspace with single-neuron … Studying the behavior of Kernel DCA is a direction for future studies. Additionally, we found and corrected a minor bug in Figure 1A: the SFA and DCA lines are now blue and red, respectively.





… organization in the final version. Reviewer 2

Neural Information Processing Systems

We thank the reviewers for their valuable feedback. This is the standard definition based on least-squares interpolation. We shall clarify this appropriately in the final draft.

"I can't see many weaknesses, apart from the complexity matters: I would be curious to have some insights on this …"

"This paper proposed to utilize the kernel embedding method to reduce the dimensionality of the statistical optimal …" This work does not propose to perform dimensionality reduction.

"The proposed kernel embedding formulation of OT is based on the cross-covariance operator which ignores higher …" No, the higher-order moments are not ignored.

Q5) "The dimensionality of canonical feature maps can be infinity.